Ext-ICAS: A Novel Self-Normalized Extractive Intra Cosine Attention Similarity Summarization
Authors
Abstract
With the continuous growth of online news articles, there arises a necessity for an efficient abstractive summarization technique to address the problem of information overloading. Abstractive summarization is highly complex and requires a deeper understanding and proper reasoning to come up with its own summary outline. The task is framed as sequence-to-sequence (seq2seq) modeling. Existing methods perform better on short sequences; however, on long sequences, performance degrades due to high computation cost. Hence, a two-phase self-normalized deep neural document summarization model, consisting of improvised extractive and cosine normalization phases, has been proposed in this paper. The novelty is to parallelize sequence training by incorporating a feed-forward network in the extractive phase using Intra Cosine Attention Similarity (Ext-ICAS) with sentence dependency position. Also, it does not require any explicit supervision. Our Bidirectional Long Short Term Memory (Bi-LSTM) encoder performs better than the Bidirectional Gated Recurrent Unit (Bi-GRU), with minimum loss and fast convergence. The model was evaluated on the Cable News Network (CNN)/Daily Mail dataset; an average ROUGE score of 0.435 was achieved, and computational cost was reduced by 59% in terms of the number of similarity computations.
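The core idea of scoring sentences by intra-document cosine similarity can be illustrated with a minimal sketch. This is not the authors' implementation: the embedding inputs, the averaging-based scoring, and the `extract_top_k` helper are all assumptions used only to show how a cosine-attention-style extractive phase might rank sentences.

```python
import numpy as np

def cosine_sim(a, b):
    # Cosine similarity between two sentence vectors.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def intra_cosine_scores(sent_vecs):
    # Score each sentence by its average cosine similarity to every
    # other sentence in the document (an intra-document attention proxy).
    n = len(sent_vecs)
    scores = np.zeros(n)
    for i in range(n):
        for j in range(n):
            if i != j:
                scores[i] += cosine_sim(sent_vecs[i], sent_vecs[j])
        scores[i] /= (n - 1)
    return scores

def extract_top_k(sent_vecs, k):
    # Return indices of the k highest-scoring sentences, in document order.
    scores = intra_cosine_scores(sent_vecs)
    return sorted(int(i) for i in np.argsort(scores)[::-1][:k])
```

Sentences that are semantically close to the rest of the document receive high scores and are kept for the summary, while outliers are dropped.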
Similar Papers
Extractive Summarization using Inter- and Intra- Event Relevance
Event-based summarization attempts to select and organize the sentences in a summary with respect to the events or the sub-events that the sentences describe. Each event has its own internal structure, and meanwhile often relates to other events semantically, temporally, spatially, causally or conditionally. In this paper, we define an event as one or more event terms along with the named entit...
Event-Based Extractive Summarization
Most approaches to extractive summarization define a set of features upon which selection of sentences is based, using algorithms independent of the features themselves. We propose a new set of features based on low-level, atomic events that describe relationships between important actors in a document or set of documents. We investigate the effect this new feature has on extractive summarizati...
Extractive Document Summarization
We present two novel and contrasting Recurrent Neural Network (RNN) based architectures for extractive summarization of documents. The Classifier based architecture sequentially accepts or rejects each sentence in the original document order for its membership in the final summary. The Selector architecture, on the other hand, is free to pick one sentence at a time in any arbitrary order to pie...
Language Independent Extractive Summarization
We demonstrate TextRank – a system for unsupervised extractive summarization that relies on the application of iterative graphbased ranking algorithms to graphs encoding the cohesive structure of a text. An important characteristic of the system is that it does not rely on any language-specific knowledge resources or any manually constructed training data, and thus it is highly portable to new ...
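The iterative graph-based ranking that TextRank applies can be sketched as a PageRank-style update over a sentence-similarity matrix. This is a generic illustration, not the TextRank authors' code; the damping factor, iteration count, and the `textrank` function name are assumptions.

```python
import numpy as np

def textrank(sim, d=0.85, iters=50):
    # PageRank-style iteration over a symmetric sentence-similarity
    # matrix: each sentence's score is redistributed to its neighbors
    # in proportion to edge weight, with damping factor d.
    n = sim.shape[0]
    W = sim.astype(float).copy()
    np.fill_diagonal(W, 0.0)          # no self-loops
    col_sums = W.sum(axis=0)
    col_sums[col_sums == 0] = 1.0     # guard against isolated nodes
    M = W / col_sums                  # column-stochastic transition matrix
    scores = np.ones(n) / n
    for _ in range(iters):
        scores = (1 - d) / n + d * (M @ scores)
    return scores
```

Sentences with high centrality in the similarity graph rank highest and are selected for the extract, which is why the method needs no language-specific resources or training data.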
Augmenting Neural Sentence Summarization Through Extractive Summarization
Neural sequence-to-sequence model has achieved great success in abstractive summarization task. However, due to the limit of input length, most of previous works can only utilize lead sentences as the input to generate the abstractive summarization, which ignores crucial information of the document. To alleviate this problem, we propose a novel approach to improve neural sentence summarization ...
Journal
Journal title: Computer Systems Science and Engineering
Year: 2023
ISSN: 0267-6192
DOI: https://doi.org/10.32604/csse.2023.027481